Last data update: May 06, 2024. (Total: 46732 publications since 2009)
Records 1-30 (of 49 Records) |
Query Trace: Williams DE[original query] |
Evaluation of Ethiopia's field epidemiology training program - frontline: perspectives of implementing partners
Kebebew T , Woldetsadik MA , Barker J , Cui A , Abedi AA , Sugerman DE , Williams DE , Turcios-Ruiz RM , Takele T , Zeynu N . BMC Health Serv Res 2023 23 (1) 406 BACKGROUND: The Field Epidemiology Training Program (FETP) has been adopted worldwide as a strategy for building epidemiology and response capacity. FETP-Frontline was introduced in Ethiopia in 2017 as a three-month in-service training. In this study, we evaluated implementing partners' perspectives with the aim of understanding program effectiveness and identifying challenges and recommendations for improvement. METHODS: A qualitative cross-sectional design was used to evaluate Ethiopia's FETP-Frontline. Using a descriptive phenomenological approach, qualitative data were collected from FETP-Frontline implementing partners, including regional, zonal, and district health offices across Ethiopia. We collected data through in-person key informant interviews using semi-structured questionnaires. Thematic analysis was conducted with the assistance of MAXQDA, with interrater reliability ensured through consistent application of theme categorization. The major themes that emerged were program effectiveness, differences in knowledge and skills between trained and untrained officers, program challenges, and recommended actions for improvement. Ethical approval was obtained from the Ethiopian Public Health Institute. Informed written consent was obtained from all participants, and confidentiality of the data was maintained throughout. RESULTS: A total of 41 interviews were conducted with key informants from FETP-Frontline implementing partners. The regional- and zonal-level experts and mentors held a Master of Public Health (MPH), whereas district health managers were Bachelor of Science (BSc) holders. The majority of respondents expressed a positive perception of FETP-Frontline.
Regional and zonal officers as well as mentors mentioned that there were observable performance differences between trained and untrained district surveillance officers. They also identified various challenges, including lack of resources for transportation, budget constraints for field projects, inadequate mentorship, high staff turnover, limited numbers of staff at the district level, lack of continued support from stakeholders, and the need for refresher training for FETP-Frontline graduates. CONCLUSIONS: Implementing partners expressed a positive perception of FETP-Frontline in Ethiopia. In addition to scaling up the program to reach all districts in support of the International Health Regulations (2005) goals, the program needs to address immediate challenges, primarily the lack of resources and inadequate mentorship. Continued monitoring of the program, refresher training, and career path development could improve retention of the trained workforce. |
Risk Factors for Ebola Virus Persistence in Semen of Survivors - Liberia.
Dyal J , Kofman A , Kollie JZ , Fankhauser J , Orone R , Soka MJ , Glaybo U , Kiawu A , Freeman E , Giah G , Tony HD , Faikai M , Jawara M , Kamara K , Kamara S , Flowers B , Kromah ML , Desamu-Thorpe R , Graziano J , Brown S , Morales-Betoulle ME , Cannon DL , Su K , Linderman SL , Plucinski M , Rogier E , Bradbury RS , Secor WE , Bowden KE , Phillips C , Carrington MN , Park YH , Martin MP , Del Pilar Aguinaga M , Mushi R , Haberling DL , Ervin ED , Klena JD , Massaquoi M , Nyenswah T , Nichol ST , Chiriboga DE , Williams DE , Hinrichs SH , Ahmed R , Vonhm BT , Rollin PE , Purpura LJ , Choi MJ . Clin Infect Dis 2022 76 (3) e849-e856 BACKGROUND: Long-term persistence of Ebola virus (EBOV) in immunologically privileged sites has been implicated in recent outbreaks of Ebola Virus Disease (EVD) in Guinea and the Democratic Republic of Congo. This study was designed to understand how the acute course of EVD, convalescence, and host immune and genetic factors may play a role in prolonged viral persistence in semen. METHODS: A cohort of 131 male EVD survivors in Liberia was enrolled in a case-case study. "Early clearers" were defined as those with two consecutive negative EBOV semen tests by real-time reverse transcriptase polymerase chain reaction (rRT-PCR) at least two weeks apart within 1 year after discharge from the Ebola Treatment Unit (ETU) or acute EVD. "Late clearers" had detectable EBOV RNA by rRT-PCR over one year following ETU discharge or acute EVD. Retrospective histories of their EVD clinical course were collected by questionnaire, followed by complete physical exams and blood work. RESULTS: Compared to early clearers, late clearers were older (median 42.5 years, p = 0.0001) and experienced fewer severe clinical symptoms (median 2, p = 0.006).
Late clearers had more lens opacifications (OR 3.9, 95% CI 1.1-13.3, p = 0.03) after accounting for age, higher total serum IgG3 titers (p = 0.007), and increased expression of the HLA-C*03:04 allele (OR 0.14, 95% CI 0.02-0.70, p = 0.007). CONCLUSIONS: Older age, decreased illness severity, elevated total serum IgG3, and HLA-C*03:04 allele expression may be risk factors for the persistence of EBOV in the semen of EVD survivors. EBOV persistence in semen may also be associated with its persistence in other immunologically protected sites, such as the eye. |
The impact of water sanitation and hygiene (WASH) improvements on hand hygiene at two Liberian hospitals during the recovery phase of an Ebola epidemic
Kanagasabai U , Enriquez K , Gelting R , Malpiedi P , Zayzay C , Kendor J , Fahnbulleh S , Cooper C , Gibson W , Brown R , Nador N , Williams DE , Chiriboga D , Niescierenko M . Int J Environ Res Public Health 2021 18 (7) Fourteen years of civil war left Liberia with crumbling infrastructure and one of the weakest health systems in the world. The 2014–2015 Ebola virus disease (EVD) outbreak exposed the vulnerabilities of the Liberian health system. Findings from the EVD outbreak highlighted the lack of infection prevention and control (IPC) practices, exacerbated by a lack of essential services such as water, sanitation, and hygiene (WASH) in healthcare facilities. The objective of this intervention was to improve IPC practice through comprehensive WASH renovations conducted at two hospitals in Liberia, prioritized by the Ministry of Health (MOH). The completion of renovations was tracked, along with the impact of the improvements on hand hygiene (HH) practice audits of healthcare workers pre- and post-intervention. An occurrence of overall HH practice was defined as the healthcare worker practicing compliant HH before and after care in a single patient encounter. Liberia Government Hospital Bomi (LGH Bomi) and St. Timothy Government Hospital (St. Timothy) achieved World Health Organization (WHO) minimum global standards for environmental health in healthcare facilities as well as Liberian national standards. Healthcare worker (HCW) overall hand hygiene compliance improved from 36% (2016) to 89% (2018) at LGH Bomi hospital and from 86% (2016) to 88% (2018) at St. Timothy hospital. Improved WASH services and IPC practices in resource-limited healthcare settings are possible if significant holistic WASH infrastructure investments are made in these settings. |
Systems thinking for health emergencies: use of process mapping during outbreak response
Durski KN , Naidoo D , Singaravelu S , Shah AA , Djingarey MH , Formenty P , Ihekweazu C , Banjura J , Kebela B , Yinka-Ogunleye A , Fall IS , Eteng W , Vandi M , Keimbe C , Abubakar A , Mohammed A , Williams DE , Lamunu M , Briand S , Changa Changa JC , Minkoulou E , Jernigan D , Lubambo D , Khalakdina A , Mamadu I , Talisuna A , Mbule Kadiobo A , Jambai A , Aylward B , Osterholm M . BMJ Glob Health 2020 5 (10) Process mapping is a systems thinking approach used to understand, analyse and optimise processes within complex systems. We aim to demonstrate how this methodology can be applied during disease outbreaks to strengthen response and health systems. Process mapping exercises were conducted in three distinct emerging disease outbreak contexts that differed in mode of transmission, size, and health system infrastructure. System functioning improved considerably in each country. In Sierra Leone, laboratory testing was accelerated from 6 days to within 24 hours. In the Democratic Republic of Congo, time to suspected case notification reduced from 7 to 3 days. In Nigeria, key data reached the national level in 48 hours instead of 5 days. Our research shows that despite the chaos and complexities associated with emerging pathogen outbreaks, the implementation of a process mapping exercise can address immediate response priorities while simultaneously strengthening components of a health system. |
Characteristics of Ebola virus disease survivor blood and semen in Liberia: Serology and RT-PCR
Kofman A , Linderman S , Su K , Purpura LJ , Ervin E , Brown S , Morales-Betoulle M , Graziano J , Cannon DL , Klena JD , Desamu-Thorpe R , Fankhauser J , Orone R , Soka M , Glaybo U , Massaquoi M , Nyenswah T , Nichol ST , Kollie J , Kiawu A , Freeman E , Giah G , Tony H , Faikai M , Jawara M , Kamara K , Kamara S , Flowers B , Mohammed K , Chiriboga D , Williams DE , Hinrichs SH , Ahmed R , Vonhm B , Rollin PE , Choi MJ . Clin Infect Dis 2020 73 (11) e3641-e3646 INTRODUCTION: Ebola virus (EBOV), species Zaire ebolavirus, may persist in the semen of male survivors of Ebola Virus Disease (EVD). We conducted a study of male survivors of the 2014-2016 EVD outbreak in Liberia and evaluated their immune responses to EBOV. We report here findings from the serologic testing of blood for EBOV-specific antibodies, molecular testing for EBOV in blood and semen, and serologic testing of peripheral blood mononuclear cells (PBMCs) in a subset of study participants. METHODS: We tested for EBOV RNA in blood by qRT-PCR, and for anti-EBOV-specific IgM and IgG antibodies by enzyme-linked immunosorbent assay (ELISA), for 126 study participants. We performed PBMC analysis on a subgroup of 26 IgG-negative participants. RESULTS: All 126 participants tested negative for EBOV RNA in blood by qRT-PCR. The blood of 26 participants tested negative for EBOV-specific IgG antibodies by ELISA. PBMCs were collected from 23/26 EBOV IgG-negative participants. Of these, 1/23 participants had PBMCs which produced anti-EBOV-specific IgG antibodies upon stimulation with EBOV-specific GP and NP antigens. DISCUSSION: The blood of EVD survivors, collected when they did not have symptoms meeting the case definition for acute or relapsed EVD, is unlikely to pose a risk for EBOV transmission. We identified one IgM/IgG negative participant who had PBMCs which produced anti-EBOV-specific antibodies upon stimulation.
Immunogenicity following acute EBOV infection may exist along a spectrum, and the absence of an antibody response should not be exclusionary in determining an individual's status as a survivor of EVD. |
Long-distance effects of epidemics: Assessing the link between the 2014 West Africa Ebola outbreak and U.S. exports and employment
Kostova D , Cassell CH , Redd JT , Williams DE , Singh T , Martel LD , Bunnell RE . Health Econ 2019 28 (11) 1248-1261 Although the economic consequences of epidemic outbreaks to affected areas are often well documented, little is known about how these might carry over into the economies of unaffected regions. In the absence of direct pathogen transmission, global trade is one mechanism through which geographically distant epidemics could reverberate to unaffected countries. This study explores the link between global public health events and U.S. economic outcomes by evaluating the role of the 2014 West Africa Ebola outbreak in U.S. exports and exports-supported U.S. jobs, 2005-2016. Estimates were obtained using difference-in-differences models in which sub-Saharan African countries were assigned to treatment and comparison groups based on their Ebola transmission status, with controls for observed and unobserved time-variant factors that may independently influence trends in trade. Multiple model specification checks were performed to ensure analytic robustness. The year of peak transmission, 2014, was estimated to result in a $1.08 billion relative reduction in U.S. merchandise exports to Ebola-affected countries, whereas estimated losses in exports-supported U.S. jobs exceeded 1,200 in 2014 and 11,000 in 2015. These findings suggest that remote disruptions in health security might play a role in U.S. economic indicators, demonstrating the interconnectedness between global health and aspects of the global economy and informing the relevance of health security efforts. |
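The difference-in-differences design described in the abstract above compares the change in exports to Ebola-affected countries against the contemporaneous change for unaffected comparison countries. A minimal sketch of that logic, using made-up export figures rather than the study's data:

```python
from statistics import mean

def did_estimate(pre_treat, post_treat, pre_ctrl, post_ctrl):
    """Difference-in-differences: the change in the treated (Ebola-affected)
    group minus the change in the comparison group over the same period."""
    return (mean(post_treat) - mean(pre_treat)) - (mean(post_ctrl) - mean(pre_ctrl))

# Hypothetical annual U.S. export values (USD millions), for illustration only
affected_pre, affected_post = [100, 110, 105], [60, 65, 70]
comparison_pre, comparison_post = [200, 210, 205], [205, 215, 210]

effect = did_estimate(affected_pre, affected_post, comparison_pre, comparison_post)
# effect is -45 here: a $45M relative reduction in this toy example
```

The study's models add country and year controls on top of this basic contrast; the sketch shows only the core estimator.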
Identifying high-risk individuals for chronic kidney disease: Results of the CHERISH Community Demonstration Project
Burrows NR , Vassalotti JA , Saydah SH , Stewart R , Gannon M , Chen SC , Li S , Pederson S , Collins AJ , Williams DE . Am J Nephrol 2018 48 (6) 447-455 BACKGROUND: Most people with chronic kidney disease (CKD) are not aware of their condition. OBJECTIVES: To assess screening criteria in identifying a population with or at high risk for CKD and to determine their level of control of CKD risk factors. METHODS: CKD Health Evaluation Risk Information Sharing (CHERISH), a demonstration project of the Centers for Disease Control and Prevention, hosted screenings at 2 community locations in each of 4 states. People with diabetes or hypertension, or aged ≥50 years, were eligible to participate. In addition to CKD testing, screening included measurement of hemoglobin A1C, blood pressure, and lipids. RESULTS: In this targeted population, among 894 people screened, CKD prevalence was 34%. Of participants with diabetes, 61% had A1C < 7%; of those with hypertension, 23% had blood pressure < 130/80 mm Hg; and of those with high cholesterol, 22% had low-density lipoprotein < 100 mg/dL. CONCLUSIONS: Using targeted selection criteria and simple clinical measures, CHERISH successfully identified a population with a high CKD prevalence and with poor control of CKD risk factors. CHERISH may prove helpful to state and local programs in implementing CKD detection programs in their communities. |
Persistence of Ebola virus after the end of widespread transmission in Liberia: an outbreak report.
Dokubo EK , Wendland A , Mate SE , Ladner JT , Hamblion EL , Raftery P , Blackley DJ , Laney AS , Mahmoud N , Wayne-Davies G , Hensley L , Stavale E , Fakoli L , Gregory C , Chen TH , Koryon A , Roth Allen D , Mann J , Hickey A , Saindon J , Badini M , Baller A , Clement P , Bolay F , Wapoe Y , Wiley MR , Logue J , Dighero-Kemp B , Higgs E , Gasasira A , Williams DE , Dahn B , Kateh F , Nyenswah T , Palacios G , Fallah MP . Lancet Infect Dis 2018 18 (9) 1015-1024 BACKGROUND: Outbreak response efforts for the 2014-15 Ebola virus disease epidemic in west Africa brought widespread transmission to an end. However, subsequent clusters of infection have occurred in the region. An Ebola virus disease cluster in Liberia in November, 2015, that was identified after a 15-year-old boy tested positive for Ebola virus infection in Monrovia, raised the possibility of transmission from a persistently infected individual. METHODS: Case investigations were done to ascertain previous contact with cases of Ebola virus disease or infection with Ebola virus. Molecular investigations on blood samples explored a potential linkage between Ebola virus isolated from cases in this November, 2015, cluster and epidemiologically linked cases from the 2014-15 west African outbreak, according to the national case database. FINDINGS: The cluster investigated was the family of the index case (mother, father, three siblings). Ebola virus genomes assembled from two cases in the November, 2015, cluster, and an epidemiologically linked Ebola virus disease case in July, 2014, were phylogenetically related within the LB5 sublineage that circulated in Liberia starting around August, 2014. Partial genomes from two additional individuals, one from each cluster, were also consistent with placement in the LB5 sublineage. Sequencing data indicate infection with a lineage of the virus from a former transmission chain in the country. 
Based on serology and epidemiological and genomic data, the most plausible scenario is that a female case in the November, 2015, cluster survived Ebola virus disease in 2014, had viral persistence or recurrent disease, and transmitted the virus to three family members a year later. INTERPRETATION: Investigation of the source of infection for the November, 2015, cluster provides evidence of Ebola virus persistence and highlights the risk for outbreaks after interruption of active transmission. These findings underscore the need for focused prevention efforts among survivors and sustained capacity to rapidly detect and respond to new Ebola virus disease cases to prevent recurrence of a widespread outbreak. FUNDING: US Centers for Disease Control and Prevention, Defense Threat Reduction Agency, and WHO. |
Cross-Border Transmission of Ebola Virus as the Cause of a Resurgent Outbreak in Liberia in April 2016.
Mate SE , Wiley MR , Ladner JT , Dokubo EK , Fakoli L , Fallah M , Nyenswah TG , DiClaro JW , Deboer JT , Williams DE , Bolay F , Palacios G . Clin Infect Dis 2018 67 (7) 1147-1149 We present new information regarding an outbreak of Ebola virus (EBOV) disease (EVD) in Liberia in early 2016 that was associated with a resurgent outbreak (“flare-up”) in N’zérékoré, Guinea, described by Diallo et al [1]. During the course of the Guinean flare-up, 3 EVD cases were diagnosed in Monrovia, Liberia. We describe genomic and epidemiologic evidence demonstrating that the Liberian cases were the result of cross-border transmission from the N’zérékoré flare-up [1]. On 31 March 2016, an oropharyngeal swab sample from a deceased 30-year-old Liberian woman (patient A) tested positive for EBOV RNA by quantitative reverse-transcription polymerase chain reaction performed at the National Reference Laboratory in Liberia. Blood samples collected from her 2 children, 5-year-old and 2-year-old boys, also tested EBOV positive on 2 April (patient B) and 5 April (patient C) by quantitative reverse-transcription polymerase chain reaction. Genetic and epidemiologic investigations were initiated to distinguish among 3 potential modes of infection: (1) transmission from a persistently infected survivor within Liberia, (2) reintroduction from active transmission of EBOV ongoing in Guinea, and (3) an independent spillover from a nonhuman reservoir. |
Race/ethnicity, dietary acid load, and risk of end-stage renal disease among US adults with chronic kidney disease
Crews DC , Banerjee T , Wesson DE , Morgenstern H , Saran R , Burrows NR , Williams DE , Powe NR . Am J Nephrol 2018 47 (3) 174-181 BACKGROUND: Dietary acid load (DAL) contributes to the risk of CKD and CKD progression. We sought to determine the relation of DAL to racial/ethnic differences in the risk of end-stage renal disease (ESRD) among persons with CKD. METHODS: Among 1,123 non-Hispanic black (NHB) and non-Hispanic white (NHW) National Health and Nutrition Examination Survey III participants with estimated glomerular filtration rate 15-59 mL/min/1.73 m2, DAL was estimated using the Remer and Manz net acid excretion (NAEes) formula and 24-h dietary recall. ESRD events were ascertained via linkage with Medicare. A competing risk model (accounting for death) was used to estimate the hazard ratio (HR) for treated ESRD, comparing NHBs with NHWs, adjusting for demographic, clinical and nutritional factors (body surface area, total caloric intake, serum bicarbonate, protein intake), and NAEes. Additionally, whether the relation of NAEes with ESRD risk varied by race/ethnicity was tested. RESULTS: At baseline, NHBs had greater NAEes (50.9 vs. 44.2 mEq/day) than NHWs. Overall, 22% developed ESRD over a median of 7.5 years. The unadjusted HR comparing NHBs to NHWs was 3.35 (95% CI 2.51-4.48), and the adjusted HR (for the factors above) was 1.68 (95% CI 1.18-2.38). A stronger association of NAEes with risk of ESRD was observed among NHBs (adjusted HR per mEq/day increase in NAEes 1.21, 95% CI 1.12-1.31) than among NHWs (HR 1.08, 95% CI 0.96-1.20); p for interaction, race/ethnicity × NAEes = 0.004. CONCLUSIONS: Among US adults with CKD, the association of DAL with progression to ESRD is stronger among NHBs than NHWs. DAL is worthy of further investigation for its contribution to kidney outcomes across race/ethnic groups. |
Rapid laboratory identification of Neisseria meningitidis serogroup C as the cause of an outbreak - Liberia, 2017
Patel JC , George J , Vuong J , Potts CC , Bozio C , Clark TA , Thomas J , Schier J , Chang A , Waller JL , Diaz MH , Whaley M , Jenkins LT , Fuller S , Williams DE , Redd JT , Arthur RR , Taweh F , Vera Walker Y , Hardy P , Freeman M , Katawera V , Gwesa G , Gbanya MZ , Clement P , Kohar H , Stone M , Fallah M , Nyenswah T , Winchell JM , Wang X , McNamara LA , Dokubo EK , Fox LM . MMWR Morb Mortal Wkly Rep 2017 66 (42) 1144-1147 On April 25, 2017, a cluster of unexplained illness and deaths among persons who had attended a funeral during April 21-22 was reported in Sinoe County, Liberia (1). Using a broad initial case definition, 31 cases were identified, including 13 (42%) deaths. Twenty-seven cases were from Sinoe County (1), and two cases each were from Grand Bassa and Montserrado counties. On May 5, 2017, initial multipathogen testing of specimens from four fatal cases using the TaqMan Array Card (TAC) assay identified Neisseria meningitidis in all specimens. Subsequent testing using direct real-time polymerase chain reaction (PCR) confirmed N. meningitidis in 14 (58%) of 24 patients with available specimens and identified N. meningitidis serogroup C (NmC) in 13 (54%) patients. N. meningitidis was detected in specimens from 11 of the 13 patients who died; no specimens were available from the other two fatal cases. On May 16, 2017, the National Public Health Institute of Liberia and the Ministry of Health of Liberia issued a press release confirming serogroup C meningococcal disease as the cause of this outbreak in Liberia. |
Associations between persistent organic pollutants, type 2 diabetes, diabetic nephropathy and mortality
Grice BA , Nelson RG , Williams DE , Knowler WC , Mason C , Hanson RL , Bullard KM , Pavkov ME . Occup Environ Med 2017 74 (7) 521-527 OBJECTIVE: Relationships were examined between persistent organic pollutants (POPs) and incident type 2 diabetes, end-stage renal disease (ESRD) and mortality. METHODS: In a nested case-control study, 300 persons without diabetes had baseline examinations between 1969 and 1974; 149 developed diabetes (cases) and 151 remained non-diabetic (controls) during 8.0 and 23.1 years of follow-up, respectively. POPs were measured at baseline. ORs for diabetes were computed by logistic regression analysis. The cases were followed from diabetes onset to ESRD, death or 2013. HRs for ESRD and mortality were computed by cause-specific hazard models. Patterns of association were explored using principal components analysis. RESULTS: PCB151 increased the odds for incident diabetes, whereas hexachlorobenzene (HCB) was protective, after adjusting for age, sex, body mass index, sample storage characteristics, glucose and lipid levels. Associations between incident diabetes and polychlorinated biphenyl (PCB) or persistent pesticide (PST) components were mostly positive but non-significant. Among the cases, 29 developed ESRD and 48 died without ESRD. PCB28, PCB49 and PCB44 increased the risk of ESRD after adjusting for baseline demographic and clinical characteristics. Several PCBs and PSTs increased the risk of death without ESRD. The principal components analysis identified PCBs with low-chlorine load positively associated with ESRD and death without ESRD, and several PSTs associated with death without ESRD. CONCLUSIONS: Most POPs were positively but not significantly associated with incident diabetes. PCB151 was significantly predictive and HCB was significantly protective for diabetes. Among participants with diabetes, low-chlorine PCBs increased the risk of ESRD and death without ESRD, whereas several PSTs predicted death without ESRD. |
The cost-effectiveness of anemia treatment for persons with chronic kidney disease
Yarnoff BO , Hoerger TJ , Simpson SA , Pavkov ME , Burrows NR , Shrestha SS , Williams DE , Zhuo X . PLoS One 2016 11 (7) e0157323 BACKGROUND: Although major guidelines uniformly recommend iron supplementation and erythropoietin stimulating agents (ESAs) for managing chronic anemia in persons with chronic kidney disease (CKD), there are differences in the recommended hemoglobin (Hb) treatment target and no guidelines consider the costs or cost-effectiveness of treatment. In this study, we explored the most cost-effective Hb target for anemia treatment in persons with CKD stages 3-4. METHODS AND FINDINGS: The CKD Health Policy Model was populated with a synthetic cohort of persons over age 30 with prevalent CKD stages 3-4 (i.e., not on dialysis) and anemia created from the 1999-2010 National Health and Nutrition Examination Survey. Incremental cost-effectiveness ratios (ICERs), computed as incremental cost divided by incremental quality adjusted life years (QALYs), were assessed for Hb targets of 10 g/dl to 13 g/dl at 0.5 g/dl increments. Targeting a Hb of 10 g/dl resulted in an ICER of $32,111 compared with no treatment and targeting a Hb of 10.5 g/dl resulted in an ICER of $32,475 compared with a Hb target of 10 g/dl. QALYs increased to 4.63 for a Hb target of 10 g/dl and to 4.75 for a target of 10.5 g/dl or 11 g/dl. Any treatment target above 11 g/dl increased medical costs and decreased QALYs. CONCLUSIONS: In persons over age 30 with CKD stages 3-4, anemia treatment is most cost-effective when targeting a Hb level of 10.5 g/dl. This study provides important information for framing guidelines related to treatment of anemia in persons with CKD. |
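The incremental cost-effectiveness ratios reported above follow the standard definition: incremental cost divided by incremental QALYs between adjacent Hb targets. A minimal sketch of that arithmetic, using hypothetical per-person costs and QALYs rather than the study's model inputs:

```python
def icer(cost_new, cost_prev, qaly_new, qaly_prev):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained
    when moving from one treatment target to the next."""
    return (cost_new - cost_prev) / (qaly_new - qaly_prev)

# Hypothetical figures for two adjacent Hb targets (not the study's inputs):
# $3,900 more in cost buys 0.12 more QALYs, so roughly $32,500 per QALY
ratio = icer(cost_new=153_900, cost_prev=150_000, qaly_new=4.75, qaly_prev=4.63)
```

Targets above 11 g/dl in the study raised costs while lowering QALYs, so no ICER is meaningful there: the higher target is simply dominated.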
Timing of initiation of maintenance dialysis: a qualitative analysis of the electronic medical records of a national cohort of patients from the Department of Veterans Affairs
Wong SP , Vig EK , Taylor JS , Burrows NR , Liu CF , Williams DE , Hebert PL , O'Hare AM . JAMA Intern Med 2016 176 (2) 228-35 IMPORTANCE: There is often considerable uncertainty about the optimal time to initiate maintenance dialysis in individual patients and little medical evidence to guide this decision. OBJECTIVE: To gain a better understanding of the factors influencing the timing of initiation of dialysis in clinical practice. DESIGN, SETTING, AND PARTICIPANTS: A qualitative analysis was conducted using the electronic medical records from the Department of Veterans Affairs (VA) of a national random sample of 1691 patients for whom the decision to initiate maintenance dialysis occurred in the VA between January 1, 2000, and December 31, 2009. Data analysis took place from June 1 to November 30, 2014. MAIN OUTCOMES AND MEASURES: Central themes related to the timing of initiation of dialysis as documented in patients' electronic medical records. RESULTS: Of the 1691 patients, 1264 (74.7%) initiated dialysis as inpatients and 1228 (72.6%) initiated dialysis with a hemodialysis catheter. Cohort members met with a nephrologist during an outpatient clinic visit a median of 3 times (interquartile range, 0-6) in the year prior to initiation of dialysis. The mean (SD) estimated glomerular filtration rate at the time of initiation for cohort members was 10.4 (5.7) mL/min/1.73 m2. The timing of initiation of dialysis reflected the complex interplay of at least 3 interrelated and dynamic processes. The first was physician practices, which ranged from practices intended to prepare patients for dialysis to those intended to forestall the need for dialysis by managing the signs and symptoms of uremia with medical interventions. The second process was sources of momentum. Initiation of dialysis was often precipitated by clinical events involving acute illness or medical procedures. In these settings, the imperative to treat often seemed to override patient choice.
The third process was patient-physician dynamics. Interactions between patients and physicians were sometimes adversarial, and physician recommendations to initiate dialysis sometimes seemed to conflict with patient priorities. CONCLUSIONS AND RELEVANCE: The initiation of maintenance dialysis reflects the care practices of individual physicians, sources of momentum for initiation of dialysis, interactions between patients and physicians, and the complex interplay of these dynamic processes over time. Our findings suggest opportunities to improve communication between patients and physicians and to better align these processes with patients' values, goals, and preferences. |
Acute kidney injury recovery pattern and subsequent risk of CKD: an analysis of Veterans Health Administration data
Heung M , Steffick DE , Zivin K , Gillespie BW , Banerjee T , Hsu CY , Powe NR , Pavkov ME , Williams DE , Saran R , Shahinian VB . Am J Kidney Dis 2015 67 (5) 742-52 BACKGROUND: Studies suggest an association between acute kidney injury (AKI) and long-term risk for chronic kidney disease (CKD), even following apparent renal recovery. Whether the pattern of renal recovery predicts kidney risk following AKI is unknown. STUDY DESIGN: Retrospective cohort. SETTING & PARTICIPANTS: Patients in the Veterans Health Administration in 2011 hospitalized (>24 hours) with at least 2 inpatient serum creatinine measurements, baseline estimated glomerular filtration rate > 60 mL/min/1.73 m2, and no diagnosis of end-stage renal disease or non-dialysis-dependent CKD: 17,049 (16.3%) with and 87,715 without AKI. PREDICTOR: Pattern of recovery to creatinine level within 0.3 mg/dL of baseline after AKI: within 2 days (fast), in 3 to 10 days (intermediate), and no recovery by 10 days (slow or unknown). OUTCOME: CKD stage 3 or higher, defined as 2 outpatient estimated glomerular filtration rates < 60 mL/min/1.73 m2 at least 90 days apart or CKD diagnosis, dialysis therapy, or transplantation. MEASUREMENTS: Risk for CKD was modeled using modified Poisson regression and time to death-censored CKD was modeled using Cox proportional hazards regression, both stratified by AKI stage. RESULTS: Most patients' AKI episodes were stage 1 (91%) and 71% recovered within 2 days. At 1 year, 18.2% had developed CKD (AKI, 31.8%; non-AKI, 15.5%; P<0.001). In stage 1, the adjusted relative risk ratios for CKD stage 3 or higher were 1.43 (95% CI, 1.39-1.48), 2.00 (95% CI, 1.88-2.12), and 2.65 (95% CI, 2.51-2.80) for fast, intermediate, and slow/unknown recovery, respectively. A similar pattern was observed in subgroup analyses incorporating albuminuria and sensitivity analysis of death-censored time to CKD. LIMITATIONS: Variable timing of follow-up and mostly male veteran cohort may limit generalizability.
CONCLUSIONS: Patients who develop AKI during a hospitalization are at substantial risk for the development of CKD by 1 year following hospitalization and timing of AKI recovery is a strong predictor, even for the mildest forms of AKI. |
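The recovery-pattern predictor above (creatinine back within 0.3 mg/dL of baseline within 2 days, in 3-10 days, or not by 10 days) can be expressed as a small classifier; the function below is an illustrative sketch of that rule, not the study's code.

```python
def recovery_pattern(baseline_scr, scr_by_day):
    """Classify AKI recovery from (day, serum creatinine in mg/dL) pairs,
    mirroring the study's fast / intermediate / slow-or-unknown buckets."""
    for day, scr in sorted(scr_by_day):
        if scr <= baseline_scr + 0.3:  # back within 0.3 mg/dL of baseline
            if day <= 2:
                return "fast"
            if day <= 10:
                return "intermediate"
            break
    return "slow/unknown"

# e.g. baseline 1.0, creatinine peaks at 2.0 then falls to 1.2 by day 2 -> "fast"
```

In practice, cases with no qualifying measurement by day 10 fall into the slow/unknown bucket, which is why the study labels that category "slow or unknown".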
Nephrology care prior to end-stage renal disease and outcomes among new ESRD patients in the USA
Gillespie BW , Morgenstern H , Hedgeman E , Tilea A , Scholz N , Shearon T , Burrows NR , Shahinian VB , Yee J , Plantinga L , Powe NR , McClellan W , Robinson B , Williams DE , Saran R . Clin Kidney J 2015 8 (6) 772-780 BACKGROUND: Longer nephrology care before end-stage renal disease (ESRD) has been linked with better outcomes. METHODS: We investigated whether longer pre-ESRD nephrology care was associated with lower mortality at both the patient and state levels among 443,761 incident ESRD patients identified in the USA between 2006 and 2010. RESULTS: Overall, 33% of new ESRD patients had received no prior nephrology care, while 28% had received care for >12 months. At the patient level, predictors of >12 months of nephrology care included having health insurance, white race, younger age, diabetes, hypertension and US region. Longer pre-ESRD nephrology care was associated with lower first-year mortality (adjusted hazard ratio = 0.58 for >12 months versus no care; 95% confidence interval 0.57-0.59), higher albumin and hemoglobin, choice of peritoneal dialysis and native fistula and discussion of transplantation options. Living in a state with a 10% higher proportion of patients receiving >12 months of pre-ESRD care was associated with a 9.3% lower relative mortality rate, standardized for case mix (R2 = 0.47; P < 0.001). CONCLUSIONS: This study represents the largest cohort of incident ESRD patients to date. Although we did not follow patients before ESRD onset, our findings, both at the individual patient and state levels, reflect the importance of early nephrology care among those with chronic kidney disease. |
Molecular Evidence of Sexual Transmission of Ebola Virus.
Mate SE , Kugelman JR , Nyenswah TG , Ladner JT , Wiley MR , Cordier-Lassalle T , Christie A , Schroth GP , Gross SM , Davies-Wayne GJ , Shinde SA , Murugan R , Sieh SB , Badio M , Fakoli L , Taweh F , de Wit E , van Doremalen N , Munster VJ , Pettitt J , Prieto K , Humrighouse BW , Stroher U , DiClaro JW , Hensley LE , Schoepp RJ , Safronetz D , Fair J , Kuhn JH , Blackley DJ , Laney AS , Williams DE , Lo T , Gasasira A , Nichol ST , Formenty P , Kateh FN , De Cock KM , Bolay F , Sanchez-Lockhart M , Palacios G . N Engl J Med 2015 373 (25) 2448-54 A suspected case of sexual transmission from a male survivor of Ebola virus disease (EVD) to his female partner (the patient in this report) occurred in Liberia in March 2015. Ebola virus (EBOV) genomes assembled from blood samples from the patient and a semen sample from the survivor were consistent with direct transmission. The genomes shared three substitutions that were absent from all other Western African EBOV sequences and that were distinct from the last documented transmission chain in Liberia before this case. Combined with epidemiologic data, the genomic analysis provides evidence of sexual transmission of EBOV and evidence of the persistence of infective EBOV in semen for 179 days or more after the onset of EVD. (Funded by the Defense Threat Reduction Agency and others.). |
Potential impact of prescribing metformin according to eGFR rather than serum creatinine
Tuot DS , Lin F , Shlipak MG , Grubbs V , Hsu CY , Yee J , Shahinian V , Saran R , Saydah S , Williams DE , Powe NR . Diabetes Care 2015 38 (11) 2059-67 OBJECTIVE: Many societies recommend using estimated glomerular filtration rate (eGFR) rather than serum creatinine (sCr) to determine metformin eligibility. We examined the potential impact of these recommendations on metformin eligibility among U.S. adults. RESEARCH DESIGN AND METHODS: Metformin eligibility was assessed among 3,902 adults with diabetes who participated in the 1999-2010 National Health and Nutrition Examination Surveys and reported routine access to health care, using conventional sCr thresholds (eligible if <1.4 mg/dL for women and <1.5 mg/dL for men) and eGFR categories: likely safe, ≥45 mL/min/1.73 m2; contraindicated, <30 mL/min/1.73 m2; and indeterminate, 30-44 mL/min/1.73 m2. Different eGFR equations were used: four-variable MDRD, Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) creatinine (CKD-EPIcr), and CKD-EPI cystatin C, as well as Cockcroft-Gault (CG) to estimate creatinine clearance (CrCl). Diabetes was defined by self-report or A1C ≥6.5% (48 mmol/mol). We used logistic regression to identify populations for whom metformin was likely safe adjusted for age, race/ethnicity, and sex. Results were weighted to the U.S. adult population. RESULTS: Among adults with sCr above conventional cutoffs, MDRD eGFR ≥45 mL/min/1.73 m2 was most common among men (adjusted odds ratio [aOR] 33.3 [95% CI 7.4-151.5] vs. women) and non-Hispanic Blacks (aOR vs. whites 14.8 [4.27-51.7]). No individuals with sCr below conventional cutoffs had an MDRD eGFR <30 mL/min/1.73 m2. All estimating equations expanded the population of individuals for whom metformin is likely safe, ranging from 86,900 (CKD-EPIcr) to 834,800 (CG). All equations identified larger populations with eGFR 30-44 mL/min/1.73 m2, for whom metformin safety is indeterminate, ranging from 784,700 (CKD-EPIcr) to 1,636,000 (CG). 
CONCLUSIONS: The use of eGFR or CrCl to determine metformin eligibility instead of sCr can expand the adult population with diabetes for whom metformin is likely safe, particularly among non-Hispanic blacks and men. |
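The eGFR-based eligibility rules described in this abstract lend themselves to a short computational sketch. The following is a minimal illustration, not the study's code: it implements the published CKD-EPI 2009 creatinine equation and the abstract's three metformin-eligibility categories; the function names and example inputs are hypothetical.

```python
def ckd_epi_creatinine_2009(scr_mg_dl: float, age: int,
                            female: bool, black: bool) -> float:
    """CKD-EPI 2009 creatinine eGFR in mL/min/1.73 m2."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = (141.0
            * min(ratio, 1.0) ** alpha
            * max(ratio, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr


def metformin_category(egfr: float) -> str:
    """Classify metformin eligibility using the abstract's eGFR cutpoints."""
    if egfr >= 45:
        return "likely safe"
    if egfr < 30:
        return "contraindicated"
    return "indeterminate"  # eGFR 30-44 mL/min/1.73 m2
```

For example, a 60-year-old non-Black man with sCr 1.0 mg/dL falls well into the "likely safe" category under both the conventional sCr threshold (<1.5 mg/dL) and the eGFR rule.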
CKD as a model for improving chronic disease care through electronic health records
Drawz PE , Archdeacon P , McDonald CJ , Powe NR , Smith KA , Norton J , Williams DE , Patel UD , Narva A . Clin J Am Soc Nephrol 2015 10 (8) 1488-99 Electronic health records have the potential to improve the care of patients with chronic medical conditions. CKD provides a unique opportunity to show this potential: the disease is common in the United States, there is significant room to improve CKD detection and management, CKD and its related conditions are defined primarily by objective laboratory data, CKD care requires collaboration by a diverse team of health care professionals, and improved access to CKD-related data would enable identification of a group of patients at high risk for multiple adverse outcomes. However, to realize the potential for improvement in CKD-related care, electronic health records will need to provide optimal functionality for providers and patients and interoperability across multiple health care settings. The goal of the National Kidney Disease Education Program Health Information Technology Working Group is to enable and support the widespread interoperability of data related to kidney health among health care software applications to optimize CKD detection and management. Over the course of the last 2 years, group members met to identify general strategies for using electronic health records to improve care for patients with CKD. 
This paper discusses these strategies and provides general goals for appropriate incorporation of CKD-related data into electronic health records and corresponding design features that may facilitate (1) optimal care of individual patients with CKD through improved access to clinical information and decision support, (2) clinical quality improvement through enhanced population management capabilities, (3) CKD surveillance to improve public health through wider availability of population-level CKD data, and (4) research to improve CKD management practices through efficiencies in study recruitment and data collection. Although these strategies may be most effectively applied in the setting of CKD, because it is primarily defined by laboratory abnormalities and therefore, an ideal computable electronic health record phenotype, they may also apply to other chronic diseases. |
Trends in Emergency Department Visit Rates for Hypoglycemia and Hyperglycemic Crisis among Adults with Diabetes, United States, 2006-2011
Wang J , Geiss LS , Williams DE , Gregg EW . PLoS One 2015 10 (8) e0134917 BACKGROUND: Despite concerns about hypoglycemia events from overly aggressive glycemic reduction, population trends in hypoglycemia and hyperglycemic crisis incidence are unclear. To address this gap, we examined changes in emergency department (ED) visit rates for hypoglycemia and hyperglycemic crisis 2006-2011. METHODS: Using data from the Nationwide Emergency Department Sample, we estimated the number of ED visits for hypoglycemia and hyperglycemic crisis via ICD-9-CM among adults with diabetes. Using data from the National Health Interview Survey, we estimated the population of adults with diabetes and calculated ED visit rates. RESULTS: From 2006 to 2011, ED visit rates for hypoglycemia declined by 22% from 1.8 to 1.4 per 100 adults (p = 0.003). The rates decreased in all age groups (all P<0.05) except those aged 18 to 44 years (P = 0.31). Hypoglycemia rates displayed a J-shaped curve across age, with the highest rates among adults aged 75 years or older (P <0.001). ED visit rates for hyperglycemic crisis did not change overall but increased 17% for adults aged 65 to 74 years (P = 0.02) and 29% for women (P = 0.01). Hyperglycemic crisis rates were highest among adults aged 18 to 44 years (P <0.001). CONCLUSIONS: Hypoglycemia rates declined for all adults except those aged 18-44 years, while rates for hyperglycemic crisis remained stable. Future preventive efforts should target the susceptible populations of adults aged 18 to 44 years and those aged 75 years or older. |
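The rates in this abstract are ED visits per 100 adults with diabetes, and the reported 22% decline follows from the start- and end-year rates. A minimal sketch of that arithmetic (function names are illustrative; only the 1.8 and 1.4 per-100 figures come from the abstract):

```python
def visit_rate_per_100(visits: float, population: float) -> float:
    """ED visits per 100 adults with diabetes."""
    return 100.0 * visits / population


def relative_change_pct(old_rate: float, new_rate: float) -> float:
    """Percent change from old_rate to new_rate (negative = decline)."""
    return 100.0 * (new_rate - old_rate) / old_rate
```

With the abstract's figures, `relative_change_pct(1.8, 1.4)` gives roughly a 22% decline.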
Possible sexual transmission of Ebola virus - Liberia, 2015
Christie A , Davies-Wayne GJ , Cordier-Lasalle T , Blackley DJ , Laney AS , Williams DE , Shinde SA , Badio M , Lo T , Mate SE , Ladner JT , Wiley MR , Kugelman JR , Palacios G , Holbrook MR , Janosko KB , de Wit E , van Doremalen N , Munster VJ , Pettitt J , Schoepp RJ , Verhenne L , Evlampidou I , Kollie KK , Sieh SB , Gasasira A , Bolay F , Kateh FN , Nyenswah TG , De Cock KM . MMWR Morb Mortal Wkly Rep 2015 64 (17) 479-81 On March 20, 2015, 30 days after the most recent confirmed Ebola Virus Disease (Ebola) patient in Liberia was isolated, Ebola was laboratory confirmed in a woman in Monrovia. The investigation identified only one epidemiologic link to Ebola: unprotected vaginal intercourse with a survivor. Published reports from previous outbreaks have demonstrated that Ebola survivors can continue to harbor virus in immunologically privileged sites for a period of time after convalescence. Ebola virus has been isolated from semen as long as 82 days after symptom onset, and viral RNA has been detected in semen up to 101 days after symptom onset. One instance of possible sexual transmission of Ebola has been reported, although the accompanying evidence was inconclusive. In addition, possible sexual transmission of Marburg virus, a filovirus related to Ebola, was documented in 1968. This report describes the investigation by the Government of Liberia and international response partners of the source of Liberia's latest Ebola case and discusses the public health implications of possible sexual transmission of Ebola virus. Based on information gathered in this investigation, CDC now recommends that contact with semen from male Ebola survivors be avoided until more information regarding the duration and infectiousness of viral shedding in body fluids is known. If male survivors have sex (oral, vaginal, or anal), a condom should be used correctly and consistently every time. |
High dietary acid load predicts ESRD among adults with CKD
Banerjee T , Crews DC , Wesson DE , Tilea AM , Saran R , Rios-Burrows N , Williams DE , Powe NR . J Am Soc Nephrol 2015 26 (7) 1693-700 Small clinical trials have shown that a reduction in dietary acid load (DAL) improves kidney injury and slows kidney function decline; however, the relationship between DAL and risk of ESRD in a population-based cohort with CKD remains unexamined. We examined the association between DAL, quantified by net acid excretion (NAEes), and progression to ESRD in a nationally representative sample of adults in the United States. Among 1486 adults with CKD aged ≥20 years enrolled in the National Health and Nutrition Examination Survey III, DAL was determined by 24-h dietary recall questionnaire. The development of ESRD was ascertained over a median 14.2 years of follow-up through linkage with the Medicare ESRD Registry. We used the Fine-Gray competing risks method to estimate the association of high, medium, and low DAL with ESRD after adjusting for demographics, nutritional factors, clinical factors, and kidney function/damage markers and accounting for intervening mortality events. In total, 311 (20.9%) participants developed ESRD. Higher levels of DAL were associated with increased risk of ESRD; relative hazards (95% confidence interval) were 3.04 (1.58 to 5.86) for the highest tertile and 1.81 (0.89 to 3.68) for the middle tertile compared with the lowest tertile in the fully adjusted model. The risk of ESRD associated with DAL tertiles increased as eGFR decreased (P trend=0.001). Among participants with albuminuria, high DAL was strongly associated with ESRD risk (P trend=0.03). In conclusion, high DAL in persons with CKD is independently associated with increased risk of ESRD in a nationally representative population. |
The future burden of CKD in the United States: a simulation model for the CDC CKD initiative
Hoerger TJ , Simpson SA , Yarnoff BO , Pavkov ME , Rios Burrows N , Saydah SH , Williams DE , Zhuo X . Am J Kidney Dis 2014 65 (3) 403-11 BACKGROUND: Awareness of chronic kidney disease (CKD), defined by kidney damage or reduced glomerular filtration rate, remains low in the United States, and few estimates of its future burden exist. STUDY DESIGN: We used the CKD Health Policy Model to simulate the residual lifetime incidence of CKD and project the prevalence of CKD in 2020 and 2030. The simulation sample was based on nationally representative data from the 1999 to 2010 National Health and Nutrition Examination Surveys. SETTING & POPULATION: Current US population. MODEL, PERSPECTIVE, & TIMELINE: Simulation model following up individuals from current age through death or age 90 years. OUTCOMES: Residual lifetime incidence represents the projected percentage of persons who will develop new CKD during their lifetimes. Future prevalence is projected for 2020 and 2030. MEASUREMENTS: Development and progression of CKD are based on annual decrements in estimated glomerular filtration rates that depend on age and risk factors. RESULTS: For US adults aged 30 to 49, 50 to 64, and 65 years or older with no CKD at baseline, the residual lifetime incidences of CKD are 54%, 52%, and 42%, respectively. The prevalence of CKD in adults 30 years or older is projected to increase from 13.2% currently to 14.4% in 2020 and 16.7% in 2030. LIMITATIONS: Due to limited data, our simulation model estimates are based on assumptions about annual decrements in estimated glomerular filtration rates. CONCLUSIONS: For an individual, lifetime risk of CKD is high, with more than half the US adults aged 30 to 64 years likely to develop CKD. Knowing the lifetime incidence of CKD may raise individuals' awareness and encourage them to take steps to prevent CKD. 
From a national burden perspective, we estimate that the population prevalence of CKD will increase in coming decades, suggesting that development of interventions to slow CKD onset and progression should be considered. |
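The simulation logic described above, following individuals from current age to death or age 90 and applying annual eGFR decrements, can be sketched as below. This is a toy illustration, not the CKD Health Policy Model: the decline parameters are hypothetical placeholders (the model's decrements depend on age and risk factors), and CKD onset is flagged at the eGFR < 60 mL/min/1.73 m2 threshold used throughout these studies.

```python
import random


def simulate_ckd_onset(start_age: int, start_egfr: float,
                       annual_decline_mean: float = 1.0,
                       annual_decline_sd: float = 0.5,
                       max_age: int = 90, seed=None):
    """Follow one individual from start_age to max_age, applying a random
    annual eGFR decrement each year; return the age at CKD onset
    (eGFR < 60) or None if CKD never develops before max_age.
    Decline parameters are illustrative, not calibrated model values."""
    rng = random.Random(seed)
    age, egfr = start_age, start_egfr
    while age < max_age:
        # Decrements are clipped at zero so eGFR never spontaneously rises.
        egfr -= max(0.0, rng.gauss(annual_decline_mean, annual_decline_sd))
        age += 1
        if egfr < 60:
            return age
    return None
```

Repeating this over a nationally representative sample and tallying the share that ever crosses the threshold would yield a residual-lifetime-incidence estimate in the spirit of the abstract.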
Dietary acid load and chronic kidney disease among adults in the United States
Banerjee T , Crews DC , Wesson DE , Tilea A , Saran R , Rios Burrows N , Williams DE , Powe NR . BMC Nephrol 2014 15 137 BACKGROUND: Diet can markedly affect acid-base status and it significantly influences chronic kidney disease (CKD) and its progression. The relationship of dietary acid load (DAL) and CKD has not been assessed on a population level. We examined the association of estimated net acid excretion (NAE(es)) with CKD, and socio-demographic and clinical correlates of NAE(es). METHODS: Among 12,293 U.S. adult participants aged >20 years in the National Health and Nutrition Examination Survey 1999-2004, we assessed dietary acid by estimating NAE(es) from nutrient intake and body surface area; kidney damage by albuminuria; and kidney dysfunction by eGFR < 60 ml/min/1.73 m(2) using the MDRD equation. We tested the association of NAE(es) with participant characteristics using median regression; while for albuminuria, eGFR, and stages of CKD we used logistic regression. RESULTS: Median regression results (beta per quintile) indicated that adults aged 40-60 years (beta [95% CI] = 3.1 [0.3-5.8]), poverty (beta [95% CI] = 7.1 [4.01-10.22]), black race (beta [95% CI] = 13.8 [10.8-16.8]), and male sex (beta [95% CI] = 3.0 [0.7-5.2]) were significantly associated with an increasing level of NAE(es). Higher levels of NAE(es) compared with lower levels were associated with greater odds of albuminuria (OR [95% CI] = 1.57 [1.20-2.05]). We observed a trend toward greater NAE(es) being associated with higher risk of low eGFR, which persisted after adjustment for confounders. CONCLUSION: Higher NAE(es) is associated with albuminuria and low eGFR, and socio-demographic risk factors for CKD are associated with higher levels of NAE(es). DAL may be an important target for future interventions in populations at high risk for CKD. |
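NAE(es) is estimated from nutrient intake and body surface area. One widely used approach, shown here as a sketch under stated assumptions (the paper's exact implementation may differ), combines the Remer-Manz potential renal acid load (PRAL) from nutrient intake with an organic-acid estimate scaled by body surface area; function names are illustrative.

```python
def pral_meq_per_day(protein_g: float, phosphorus_mg: float,
                     potassium_mg: float, calcium_mg: float,
                     magnesium_mg: float) -> float:
    """Potential renal acid load (Remer-Manz coefficients), mEq/day."""
    return (0.4888 * protein_g + 0.0366 * phosphorus_mg
            - 0.0205 * potassium_mg - 0.0125 * calcium_mg
            - 0.0263 * magnesium_mg)


def naees_meq_per_day(pral: float, body_surface_area_m2: float) -> float:
    """Estimated net acid excretion: PRAL plus estimated organic acids,
    the latter scaled by body surface area (41 mEq/day per 1.73 m2)."""
    return pral + body_surface_area_m2 * 41.0 / 1.73
```

A diet higher in protein and phosphorus relative to potassium, calcium, and magnesium raises PRAL, and hence NAE(es), which is the direction of effect the abstract associates with albuminuria and low eGFR.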
Does knowing one's elevated glycemic status make a difference in macronutrient intake?
Bardenheier BH , Cogswell ME , Gregg EW , Williams DE , Zhang Z , Geiss LS . Diabetes Care 2014 37 (12) 3143-9 OBJECTIVE: To determine whether macronutrient intake differs by awareness of glycemic status among people with diabetes and prediabetes. RESEARCH DESIGN AND METHODS: We used 24-h dietary recall and other data from 3,725 nonpregnant adults with diabetes or prediabetes aged ≥20 years from the morning fasting sample of the 2005-2010 National Health and Nutrition Examination Surveys. Diabetes and prediabetes awareness were self-reported; those unaware of diabetes and prediabetes were defined by fasting plasma glucose (FPG) ≥126 mg/dL or HbA1c ≥6.5% and FPG 100-125 mg/dL or HbA1c of 5.7%-6.4%, respectively. Components of nutrient intake on a given day assessed were total calories, sugar, carbohydrates, fiber, protein, fat, and total cholesterol, stratified by sex and glycemic status awareness. Estimates of nutrient intake were adjusted for age, race/ethnicity, education level, BMI, smoking status, and family history of diabetes. RESULTS: Men with diagnosed diabetes consumed less sugar (mean 86.8 vs. 116.8 g) and carbohydrates (mean 235.0 vs. 262.1 g) and more protein (mean 92.3 vs. 89.7 g) than men with undiagnosed diabetes. Similarly, women with diagnosed diabetes consumed less sugar (mean 79.1 vs. 95.7 g) and more protein (mean 67.4 vs. 56.6 g) than women with undiagnosed diabetes. No significant differences in macronutrient intake were found by awareness of prediabetes. All participants, regardless of sex or glycemic status, consumed on average less than the American Diabetes Association recommendations for fiber intake (i.e., 14 g/1,000 kcal) and slightly more saturated fat than recommended (>10% of total kilocalories). CONCLUSIONS: Screening and subsequent knowledge of glycemic status may favorably affect some dietary patterns for people with diabetes. |
Taming the chronic kidney disease epidemic: a global view of surveillance efforts
Radhakrishnan J , Remuzzi G , Saran R , Williams DE , Rios-Burrows N , Powe N , Bruck K , Wanner C , Stel VS , Venuthurupalli SK , Hoy WE , Healy HG , Salisbury A , Fassett RG , O'Donoghue D , Roderick P , Matsuo S , Hishida A , Imai E , Iimuro S . Kidney Int 2014 86 (2) 246-50 Chronic kidney disease is now recognized to be a worldwide problem associated with significant morbidity and mortality and there is a steep increase in the number of patients reaching end-stage renal disease. In many parts of the world, the disease affects younger people without diabetes or hypertension. The costs to family and society can be enormous. Early recognition of CKD may help prevent disease progression and the subsequent decline in health and longevity. Surveillance programs for early CKD detection are beginning to be implemented in a few countries. In this article, we will focus on the challenges and successes of these programs with the hope that their eventual and widespread use will reduce the complications, deaths, disabilities, and economic burdens associated with CKD worldwide. |
Changes in diabetes-related complications in the United States, 1990-2010
Gregg EW , Li Y , Wang J , Burrows NR , Ali MK , Rolka D , Williams DE , Geiss L . N Engl J Med 2014 370 (16) 1514-23 BACKGROUND: Preventive care for adults with diabetes has improved substantially in recent decades. We examined trends in the incidence of diabetes-related complications in the United States from 1990 through 2010. METHODS: We used data from the National Health Interview Survey, the National Hospital Discharge Survey, the U.S. Renal Data System, and the U.S. National Vital Statistics System to compare the incidences of lower-extremity amputation, end-stage renal disease, acute myocardial infarction, stroke, and death from hyperglycemic crisis between 1990 and 2010, with age standardized to the U.S. population in the year 2000. RESULTS: Rates of all five complications declined between 1990 and 2010, with the largest relative declines in acute myocardial infarction (-67.8%; 95% confidence interval [CI], -76.2 to -59.3) and death from hyperglycemic crisis (-64.4%; 95% CI, -68.0 to -60.9), followed by stroke and amputations, which each declined by approximately half (-52.7% and -51.4%, respectively); the smallest decline was in end-stage renal disease (-28.3%; 95% CI, -34.6 to -21.6). The greatest absolute decline was in the number of cases of acute myocardial infarction (95.6 fewer cases per 10,000 persons; 95% CI, 76.6 to 114.6), and the smallest absolute decline was in the number of deaths from hyperglycemic crisis (-2.7; 95% CI, -2.4 to -3.0). Rate reductions were larger among adults with diabetes than among adults without diabetes, leading to a reduction in the relative risk of complications associated with diabetes. When expressed as rates for the overall population, in which a change in prevalence also affects complication rates, there was a decline in rates of acute myocardial infarction and death from hyperglycemic crisis (2.7 and 0.1 fewer cases per 10,000, respectively) but not in rates of amputation, stroke, or end-stage renal disease. 
CONCLUSIONS: Rates of diabetes-related complications have declined substantially in the past two decades, but a large burden of disease persists because of the continued increase in the prevalence of diabetes. (Funded by the Centers for Disease Control and Prevention.). |
Trends in incidence of end-stage renal disease among persons with diagnosed diabetes - Puerto Rico, 1996-2010
Burrows NR , Hora I , Williams DE , Geiss LS . MMWR Morb Mortal Wkly Rep 2014 63 (9) 186-9 During 2010, an estimated 6,091 persons aged ≥18 years in Puerto Rico were living with end-stage renal disease (ESRD) (i.e., kidney failure that requires regular dialysis or kidney transplantation for survival). This included 1,462 persons who began treatment for ESRD in 2010. Diabetes is the leading cause of ESRD in Puerto Rico, accounting for 66% of new cases in adults, followed by hypertension, which accounts for 15% of the cases. Although the number of adults initiating ESRD treatment (i.e., dialysis or kidney transplantation) in Puerto Rico each year who have diabetes listed as a primary cause (ESRD-D) has increased since 1996, ESRD-D incidence among adults with diagnosed diabetes has not shown a consistent trend. To assess recent trends in ESRD-D incidence among adults aged ≥18 years in Puerto Rico with diagnosed diabetes and to further examine trends by age group and sex, CDC analyzed 1996-2010 data from the U.S. Renal Data System (USRDS) and the Behavioral Risk Factor Surveillance System (BRFSS). After increasing in the late 1990s, ESRD-D incidence decreased during the 2000s among adult men and among persons aged 18-44 years with diagnosed diabetes in Puerto Rico. Throughout the period, ESRD-D incidence among adult women and among persons aged 45-64 and ≥75 years with diagnosed diabetes did not show a consistent trend, and ESRD-D incidence among persons aged 65-74 years with diagnosed diabetes increased. Increased awareness of the risk factors for kidney disease and implementation of effective interventions to prevent or delay kidney disease among persons with diagnosed diabetes might decrease ESRD incidence in Puerto Rico, particularly among women and older persons. |
Effect of food insecurity on chronic kidney disease in lower-income Americans
Crews DC , Kuczmarski MF , Grubbs V , Hedgeman E , Shahinian VB , Evans MK , Zonderman AB , Burrows NR , Williams DE , Saran R , Powe NR . Am J Nephrol 2014 39 (1) 27-35 BACKGROUND: The relation of food insecurity (inability to acquire nutritionally adequate and safe foods) and chronic kidney disease (CKD) is unknown. We examined whether food insecurity is associated with prevalent CKD among lower-income individuals in both the general US adult population and an urban population. METHODS: We conducted cross-sectional analyses of lower-income participants of the National Health and Nutrition Examination Survey (NHANES) 2003-2008 (n = 9,126) and the Healthy Aging in Neighborhoods of Diversity across the Life Span (HANDLS) study (n = 1,239). Food insecurity was defined based on questionnaires and CKD was defined by reduced estimated glomerular filtration rate or albuminuria; adjustment was performed with multivariable logistic regression. RESULTS: In NHANES, the age-adjusted prevalence of CKD was 20.3, 17.6, and 15.7% for the high, marginal, and no food insecurity groups, respectively. Analyses adjusting for sociodemographics and smoking status revealed high food insecurity to be associated with greater odds of CKD only among participants with either diabetes (OR = 1.67, 95% CI: 1.14-2.45 comparing high to no food insecurity groups) or hypertension (OR = 1.37, 95% CI: 1.03-1.82). In HANDLS, the age-adjusted CKD prevalence was 5.9 and 4.6% for those with and without food insecurity, respectively (p = 0.33). Food insecurity was associated with a trend towards greater odds of CKD (OR = 1.46, 95% CI: 0.98-2.18) with no evidence of effect modification across diabetes, hypertension, or obesity subgroups. CONCLUSION: Food insecurity may contribute to disparities in kidney disease, especially among persons with diabetes or hypertension, and is worthy of further study. |
Prevalence, characteristics and clinical diagnosis of maturity onset diabetes of the young due to mutations in HNF1A, HNF4A, and glucokinase: results from the SEARCH for Diabetes in Youth.
Pihoker C , Gilliam LK , Ellard S , Dabelea D , Davis C , Dolan LM , Greenbaum CJ , Imperatore G , Lawrence JM , Marcovina SM , Mayer-Davis E , Rodriguez BL , Steck AK , Williams DE , Hattersley AT . J Clin Endocrinol Metab 2013 98 (10) 4055-62 AIMS: Our study aims were to determine the frequency of MODY mutations (HNF1A, HNF4A, glucokinase) in a diverse population of youth with diabetes and to assess how well clinical features identify youth with maturity-onset diabetes of the young (MODY). METHODS: The SEARCH for Diabetes in Youth study is a US multicenter, population-based study of youth with diabetes diagnosed at age younger than 20 years. We sequenced genomic DNA for mutations in the HNF1A, HNF4A, and glucokinase genes in 586 participants enrolled in SEARCH between 2001 and 2006. Selection criteria included diabetes autoantibody negativity and fasting C-peptide (FCP) levels of 0.8 ng/mL or greater. RESULTS: We identified a mutation in one of three MODY genes in 47 participants, or 8.0% of the tested sample, for a prevalence of at least 1.2% in the pediatric diabetes population. Of these, only 3 had a clinical diagnosis of MODY, and the majority was treated with insulin. Compared with the MODY-negative group, MODY-positive participants had lower FCP levels (2.2 +/- 1.4 vs 3.2 +/- 2.1 ng/mL, P < .01) and fewer type 2 diabetes-like metabolic features. Parental history of diabetes did not significantly differ between the 2 groups. CONCLUSIONS/INTERPRETATION: In this systematic study of MODY in a large pediatric US diabetes cohort, unselected by referral pattern or family history, MODY was usually misdiagnosed and incorrectly treated with insulin. Although many type 2 diabetes-like metabolic features were less common in the mutation-positive group, no single characteristic identified all patients with mutations. Clinicians should be alert to the possibility of MODY diagnosis, particularly in antibody-negative youth with diabetes. |
- Page last reviewed:Feb 1, 2024
- Page last updated:May 06, 2024